Microsoft’s Security Copilot Enters General Availability
Microsoft Security Copilot, also referred to as Copilot for Security, will be generally available starting April 1, the company announced today. Microsoft revealed that pricing for Security Copilot will start at $4 per hour, calculated based on usage.
At a press briefing on March 7 at the Microsoft Experience Center in New York (Figure A), we saw how Microsoft positions Security Copilot as a way for security personnel to get real-time assistance with their work and pull data from across Microsoft’s suite of security services.
Microsoft Security Copilot availability and pricing
Security Copilot was first announced in March 2023, and early access opened in October 2023. General availability will be worldwide, with the Security Copilot user interface offered in 25 languages. Security Copilot can process prompts and respond in eight languages.
Security Copilot will be sold through a consumption-based pricing model, with customers paying according to their needs. Usage is measured in Security Compute Units (SCUs). Customers will be billed monthly for the number of SCUs provisioned hourly, at a rate of $4 per hour, with a minimum of one hour of use. Microsoft frames this as a way to let users start experimenting with Security Copilot and then scale up as needed.
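To put the consumption model in concrete terms, here is a minimal back-of-the-envelope sketch in Python. The flat provisioning pattern and the 730-hour month are illustrative assumptions for the estimate, not Microsoft's actual billing calculator.

```python
# Rough estimate of a monthly Security Copilot bill under the consumption model
# described above: SCUs are provisioned hourly at $4 per hour, billed monthly.
HOURLY_RATE_PER_SCU = 4.00  # USD per Security Compute Unit per hour


def estimate_monthly_cost(scus_provisioned: int, hours_in_month: int = 730) -> float:
    """Cost if a fixed number of SCUs stays provisioned for the whole month."""
    return scus_provisioned * HOURLY_RATE_PER_SCU * hours_in_month


# Example: keeping 3 SCUs provisioned around the clock for a 730-hour month.
print(f"${estimate_monthly_cost(3):,.2f}")  # -> $8,760.00
```

In practice, the minimum-one-hour billing increment means organizations can provision a single SCU briefly to trial the service before committing to an always-on allocation.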
How Microsoft Security Copilot assists security professionals
Security Copilot can work as a standalone application drawing data from many different sources (Figure B) or as an embedded chat window within other Microsoft security services.
Security Copilot provides suggestions on what a security analyst might do next based on a conversation or incident report.
Putting AI in the hands of cybersecurity professionals helps defend against attackers who operate in a community of “ransomware as a gig economy,” said Vasu Jakkal, corporate vice president of Microsoft Security.
What differentiates Microsoft Security Copilot from competitors, Jakkal said, is that it draws from ChatGPT and can use data from across a vast number of connected Microsoft applications.
“We process 78 trillion signals, which is our new number (compared to previous data), and so all these signals are going on, what we call grounding the security. And without these signals, you can’t really have a gen (generative) AI tool, because it needs to know these connections — it needs to know the path,” Jakkal said.
Jakkal pointed out that Microsoft is investing $20 billion in security over five years, in addition to separate AI investments.
SEE: NIST updated its Cybersecurity Framework in February, adding a new area of focus: governance. (TechRepublic)
One benefit of Security Copilot’s conversational skills is that it can write incident reports very quickly, and vary those reports to be more or less technical depending on the employee they are intended for, Microsoft representatives said.
“To me, Copilot for Security is an absolute game changer for an executive because it allows them a summary (of security incidents). A summary of the size that you want,” said Sherrod DeGrippo, director of threat intelligence strategy at Microsoft.
Security Copilot’s ability to tailor reports helps CISOs bridge the technical and executive worlds, said DeGrippo.
“My hot take is that CISOs are a different breed of executive suite person,” said DeGrippo. “They want depth. They want to get technical. They want to have their hands in there. And they want to also have the ability to move through those executive circles as the expert. They want to be their own expert when they talk to the board, when they talk to their CEO, whatever it may be, their CFO.”
Learnings from Security Copilot private preview and early access
Naadia Sayed, principal product manager for Copilot for Security, said that, during the private preview and early access periods, partners told Microsoft which APIs they wanted to connect to Security Copilot. Customers with custom APIs found it especially useful that Security Copilot could connect to those APIs. During the preview periods, partners were able to tweak Security Copilot to their organization’s specific workflows, prompts and scenarios.
The private preview started with using the Copilot generative AI assistant for security operations tasks, Jakkal told TechRepublic. From there, customers asked for Copilot integration with other skills — identity-related tasks, for example.
“We’re also seeing on the other hand where they want to use our security tools for governance of AI as well,” Jakkal said.
For example, customers wanted to be sure that another tool such as ChatGPT wasn’t sharing nonpublic company information such as salaries (Figure C).
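To make that governance use case concrete, here is a minimal sketch of the kind of pre-submission check such tooling performs: flagging prompts that appear to contain nonpublic data before they reach an external generative AI tool. The keyword and pattern rules below are illustrative assumptions, not Microsoft's actual data-loss-prevention logic.

```python
import re

# Illustrative sensitive-data patterns (hypothetical, for demonstration only).
SENSITIVE_PATTERNS = [
    re.compile(r"\bsalar(y|ies)\b", re.IGNORECASE),
    re.compile(r"\bcompensation\b", re.IGNORECASE),
    re.compile(r"\$\s?\d{2,3},\d{3}\b"),  # dollar figures such as $125,000
]


def is_prompt_allowed(prompt: str) -> bool:
    """Return False if the prompt matches any sensitive-data pattern."""
    return not any(pattern.search(prompt) for pattern in SENSITIVE_PATTERNS)


print(is_prompt_allowed("Summarize last quarter's phishing incidents"))  # True
print(is_prompt_allowed("List employee salaries for the finance team"))  # False
```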
“Something that we’re finding is that people have more and more of an appetite for threat intelligence to help direct their resource usage,” said DeGrippo. “We’re seeing customers make resource decisions, such as: understanding threat priority allows them to say we need to put more people and focus and time in these particular areas. And getting that level of resource usage, prioritization and efficiency has made customers really happy. And so we’re looking at making sure that they have those tools.”